
WavePulse: Real-time Content Analytics of Radio Livestreams

Mittal, Govind, Gupta, Sarthak, Wagle, Shruti, Chopra, Chirag, DeMattee, Anthony J, Memon, Nasir, Ahamad, Mustaque, Hegde, Chinmay

arXiv.org Artificial Intelligence

Radio remains a pervasive medium for mass information dissemination, with AM/FM stations reaching more Americans than either smartphone-based social networking or live television. Increasingly, radio broadcasts are also streamed online and accessed over the Internet. We present WavePulse, a framework that records, documents, and analyzes radio content in real-time. While our framework is generally applicable, we showcase the efficacy of WavePulse in a collaborative project with a team of political scientists focusing on the 2024 Presidential Elections. We use WavePulse to monitor livestreams of 396 news radio stations over a period of three months, processing close to 500,000 hours of audio streams. These streams were converted into time-stamped, diarized transcripts and analyzed to answer key political science questions at both the national and state levels. Our analysis revealed how local issues interacted with national trends, providing insights into information flow. Our results demonstrate WavePulse's efficacy in capturing and analyzing content from radio livestreams sourced from the Web. Code and dataset can be accessed at https://wave-pulse.io.
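The abstract describes converting audio streams into time-stamped, diarized transcripts. As a minimal sketch of what such a record might look like, the snippet below renders diarized segments as speaker-attributed transcript lines; the field names, station call sign, and speaker labels are illustrative assumptions, not the actual WavePulse schema.

```python
from dataclasses import dataclass

# Hypothetical record shape for one diarized transcript segment; the
# real WavePulse schema may differ -- all field names here are assumptions.
@dataclass
class Segment:
    station: str   # station identifier, e.g. a call sign
    start: float   # seconds from the start of the recording
    end: float
    speaker: str   # diarization label, e.g. "SPEAKER_00"
    text: str

def to_transcript(segments):
    """Render segments as time-stamped, speaker-attributed lines."""
    lines = []
    for seg in sorted(segments, key=lambda s: s.start):
        stamp = f"[{seg.start:07.2f}-{seg.end:07.2f}]"
        lines.append(f"{stamp} {seg.speaker}: {seg.text}")
    return "\n".join(lines)

demo = [
    Segment("WXYZ", 0.0, 4.5, "SPEAKER_00", "Good morning, this is the news."),
    Segment("WXYZ", 4.5, 9.0, "SPEAKER_01", "Thanks. First, the election."),
]
print(to_transcript(demo))
```

Sorting by start time keeps interleaved speaker turns in broadcast order, which is what downstream content analysis over three months of streams would consume.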


Massive sinkhole collapses soccer field at Illinois park

FOX News

A 100-foot-wide sinkhole opened beneath a soccer field at Gordon Moore Park in Alton, Illinois, at around 10 a.m. on Wednesday, the result of a collapse at a nearby underground mine, officials said. Surveillance video from the City of Alton shows the moment the sinkhole opens and swallows a light pole on the field in a cloud of dust. Drone video shows the aftermath of the crater in the center of the field.


Linearity of Relation Decoding in Transformer Language Models

Hernandez, Evan, Sharma, Arnab Sen, Haklay, Tal, Meng, Kevin, Wattenberg, Martin, Andreas, Jacob, Belinkov, Yonatan, Bau, David

arXiv.org Artificial Intelligence

Much of the knowledge encoded in transformer language models (LMs) may be expressed in terms of relations: relations between words and their synonyms, entities and their attributes, etc. We show that, for a subset of relations, this computation is well-approximated by a single linear transformation on the subject representation. Linear relation representations may be obtained by constructing a first-order approximation to the LM from a single prompt, and they exist for a variety of factual, commonsense, and linguistic relations. However, we also identify many cases in which LM predictions capture relational knowledge accurately, but this knowledge is not linearly encoded in their representations. Our results thus reveal a simple, interpretable, but heterogeneously deployed knowledge representation strategy in transformer LMs.
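The abstract's core claim is that, for some relations, the mapping from a subject representation to the relation's output is well-approximated by a first-order (linear) approximation. The sketch below illustrates that idea on a toy nonlinear function standing in for the LM's computation; the function `f`, its dimensions, and the expansion point are all illustrative assumptions, not the paper's actual model or method details.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear map standing in for the LM's computation from a subject
# representation s to the relation's output representation (an assumption,
# not the actual transformer from the paper).
A = rng.normal(size=(8, 8))
B = rng.normal(size=(8, 8))

def f(s):
    return B @ np.tanh(A @ s)

def linear_relation_approx(f, s0, eps=1e-5):
    """First-order approximation f(s) ~= W s + b around s0,
    with W the Jacobian estimated by central differences."""
    d = s0.size
    W = np.empty((f(s0).size, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        W[:, i] = (f(s0 + e) - f(s0 - e)) / (2 * eps)
    b = f(s0) - W @ s0
    return W, b

s0 = rng.normal(size=8)
W, b = linear_relation_approx(f, s0)

# Near s0, the affine map W s + b tracks the nonlinear computation closely.
s = s0 + 0.01 * rng.normal(size=8)
err = np.linalg.norm(f(s) - (W @ s + b)) / np.linalg.norm(f(s))
print(f"relative error: {err:.2e}")
```

The point of the sketch is only the mechanics of a first-order approximation from a single expansion point; whether such a linear map is faithful for a given relation in a real LM is exactly the empirical question the paper studies.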